Metric Entropy of High Dimensional Distributions
Authors
Abstract
Let F_d be the collection of all d-dimensional probability distribution functions on [0, 1]^d, d ≥ 2. The metric entropy of F_d under the L^2([0, 1]^d) norm is studied. The exact rate is obtained for d = 1, 2, and bounds are given for d ≥ 3. Connections with the small deviation probability of Brownian sheets under the sup-norm are established.
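For orientation, the connection alluded to here is of the type given by the classical small deviation/metric entropy duality of Kuelbs-Li and Li-Linde; the general form below is quoted as background, under the usual regularity assumptions on the rates, and is not a result of this paper. If X is a centered Gaussian random element of a Banach space and K is the unit ball of its reproducing kernel Hilbert space, then

\[
-\log \mathbb{P}\bigl(\|X\| \le \varepsilon\bigr) \asymp \varepsilon^{-\alpha}\Bigl(\log\tfrac{1}{\varepsilon}\Bigr)^{\beta}
\quad\Longleftrightarrow\quad
\log N(\varepsilon, K, \|\cdot\|) \asymp \varepsilon^{-\frac{2\alpha}{2+\alpha}}\Bigl(\log\tfrac{1}{\varepsilon}\Bigr)^{\frac{2\beta}{2+\alpha}},
\]

where N(ε, K, ‖·‖) is the smallest number of ε-balls needed to cover K. For the Brownian sheet under the sup-norm, a relation of this kind is what ties entropy bounds for classes such as F_d to small deviation estimates.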
Similar Papers
Investigation of Entropy Generation in 3-D Laminar Forced Convection Flow over a Backward Facing Step with Bleeding
A numerical investigation of entropy generation in laminar forced convection of gas flow over a backward facing step in a horizontal duct under bleeding conditions is presented. To calculate entropy generation from the second law of thermodynamics in a forced convection flow, the velocity and temperature distributions are needed first. For this purpose, the three-dimensional Cartesian co...
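As a point of reference, studies of this type typically evaluate the local volumetric entropy generation rate from the computed velocity and temperature fields; a standard form of that rate (stated here as background, not quoted from this abstract) is

\[
\dot{S}'''_{\mathrm{gen}} \;=\; \frac{k}{T^2}\,\lvert\nabla T\rvert^{2} \;+\; \frac{\mu}{T}\,\Phi,
\]

where k is the thermal conductivity, μ the dynamic viscosity, T the local absolute temperature, and Φ the viscous dissipation function; the first term is the heat transfer irreversibility and the second the fluid friction irreversibility.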
Entropy of a semigroup of maps from a set-valued view
In this paper, we introduce a new entropy-like invariant, named Hausdorff metric entropy, for finitely generated semigroups acting on compact metric spaces from a set-valued view and study its properties. We establish the relation between Hausdorff metric entropy and topological entropy of a semigroup defined by Bis. Some examples with positive or zero Hausdorff metric entropy are given. Moreov...
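For readers unfamiliar with the set-valued viewpoint, the Hausdorff metric in the name is the usual distance between nonempty compact subsets A, B of a metric space (X, d); this is background notation only, not taken from the abstract above:

\[
d_H(A, B) \;=\; \max\Bigl\{\,\sup_{a\in A}\inf_{b\in B} d(a,b),\ \sup_{b\in B}\inf_{a\in A} d(a,b)\Bigr\}.
\]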
Entropy Estimate For High Dimensional Monotonic Functions
We establish upper and lower bounds for the metric entropy and bracketing entropy of the class of d-dimensional bounded monotonic functions under L^p norms. It is interesting to see that both the metric entropy and bracketing entropy have different behaviors for p < d/(d − 1) and p > d/(d − 1). We apply the new bounds for bracketing entropy to establish a global rate of convergence of the MLE of ...
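As background, the two notions compared here are standard (the definitions below are not quoted from the abstract): for a function class \mathcal{F} and an L^p norm,

\[
H(\varepsilon,\mathcal{F},L^p) = \log N(\varepsilon,\mathcal{F},\|\cdot\|_p),
\qquad
H_{[\,]}(\varepsilon,\mathcal{F},L^p) = \log N_{[\,]}(\varepsilon,\mathcal{F},\|\cdot\|_p),
\]

where N(ε, F, ‖·‖_p) is the smallest number of ε-balls needed to cover F, and N_{[ ]}(ε, F, ‖·‖_p) is the smallest number of brackets [l, u] with ‖u − l‖_p ≤ ε whose union contains F. In general N(ε, F, ‖·‖_p) ≤ N_{[ ]}(2ε, F, ‖·‖_p), so bracketing numbers dominate covering numbers.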
On the geometry of generalized Gaussian distributions (arXiv:0706.0606v1 [math.PR] 5 Jun 2007)
In this paper we consider the space of those probability distributions which maximize the q-Rényi entropy. These distributions have the same parameter space for every q, and in the q = 1 case these are the normal distributions. Some methods to endow this parameter space with a Riemannian metric are presented: the second derivative of the q-Rényi entropy, the Tsallis entropy and the relative entropy gi...
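For reference, the Rényi entropy of order q ≠ 1 of a probability density p is (a standard definition, not taken from this abstract)

\[
H_q(p) \;=\; \frac{1}{1-q}\,\log \int p(x)^q \, dx,
\qquad
\lim_{q\to 1} H_q(p) \;=\; -\int p(x)\log p(x)\, dx,
\]

so the q → 1 limit recovers the Shannon differential entropy, consistent with the remark that the q = 1 case gives the normal distributions, which maximize Shannon entropy under fixed covariance.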
A Locally Adaptive Normal Distribution
The multivariate normal density is a monotonic function of the distance to the mean, and its ellipsoidal shape is due to the underlying Euclidean metric. We suggest replacing this metric with a locally adaptive, smoothly changing (Riemannian) metric that favors regions of high local density. The resulting locally adaptive normal distribution (LAND) is a generalization of the normal distributio...
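The opening claim can be made explicit with a standard identity (background, not quoted from the abstract): the multivariate normal density depends on x only through the Mahalanobis distance to the mean,

\[
\mathcal{N}(x \mid \mu, \Sigma) \;=\; \frac{1}{(2\pi)^{D/2}\lvert\Sigma\rvert^{1/2}}\,
\exp\!\Bigl(-\tfrac{1}{2}\, d_\Sigma(x,\mu)^2\Bigr),
\qquad
d_\Sigma(x,\mu) \;=\; \sqrt{(x-\mu)^{\top}\Sigma^{-1}(x-\mu)},
\]

so its level sets are ellipsoids determined by Σ; the LAND construction described above replaces this fixed quadratic metric with a position-dependent Riemannian one.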